Feature selection in jump models

Authors

Abstract

Jump models switch infrequently between states in order to fit a sequence of data while taking the ordering of the data into account. We propose a new framework for joint feature selection, parameter estimation, and state-sequence estimation in jump models. Feature selection is necessary in high-dimensional settings where the number of features is large compared to the number of observations and the underlying states differ only with respect to a subset of the features. We develop and implement a coordinate descent algorithm that alternates between selecting the features and estimating the model parameters and state sequence, and which scales to large data sets with large numbers of (noisy) features. We demonstrate the usefulness of the proposed framework by comparing it with other methods on both simulated and real data in the form of financial returns, protein sequences, and text. By leveraging the information embedded in the ordering of the data, the resulting sparse jump model outperforms all other methods considered and is remarkably robust to noise.

Highlights:
• Joint feature selection, parameter estimation, and state-sequence estimation in jump models.
• Determine which features are responsible for the differences between the states.
• Coordinate descent algorithm that scales to large numbers of (noisy) features.
• Evaluation on simulated and real data: financial returns, protein sequences, and text.
• The sparse jump model outperforms all other methods considered and is remarkably robust to noise.
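
The following is a minimal sketch of such an alternating (coordinate descent) scheme, written as a rough illustration rather than the paper's exact formulation. It assumes NumPy, a squared-error loss with state-conditional mean parameters, a fixed jump penalty lam, and a sparse-k-means-style feature weighting with an L1 budget kappa; the function names sparse_jump_model, viterbi_states and update_weights, and all of these modelling choices, are assumptions made for illustration.

import numpy as np


def viterbi_states(Y, theta, w, lam):
    # Optimal state sequence for fixed parameters via dynamic programming:
    # cost = sum_t w-weighted ||y_t - theta[s_t]||^2 + lam * #{t : s_t != s_{t-1}}
    T, K = Y.shape[0], theta.shape[0]
    step = (w * (Y[:, None, :] - theta[None, :, :]) ** 2).sum(axis=2)  # (T, K)
    V = np.empty((T, K))
    back = np.empty((T, K), dtype=int)
    V[0] = step[0]
    for t in range(1, T):
        trans = V[t - 1][:, None] + lam * (1.0 - np.eye(K))  # [from, to]
        back[t] = trans.argmin(axis=0)
        V[t] = step[t] + trans.min(axis=0)
    s = np.empty(T, dtype=int)
    s[-1] = int(V[-1].argmin())
    for t in range(T - 1, 0, -1):
        s[t - 1] = back[t, s[t]]
    return s


def update_weights(Y, s, kappa):
    # Sparse-k-means-style weight update: reward features whose between-state
    # dispersion is large, subject to ||w||_2 = 1 and an L1 budget kappa.
    total = ((Y - Y.mean(axis=0)) ** 2).sum(axis=0)
    within = np.zeros(Y.shape[1])
    for k in np.unique(s):
        Yk = Y[s == k]
        within += ((Yk - Yk.mean(axis=0)) ** 2).sum(axis=0)
    a = np.maximum(total - within, 0.0)

    def w_of(delta):
        u = np.maximum(a - delta, 0.0)
        n = np.linalg.norm(u)
        return u / n if n > 0 else u

    w = w_of(0.0)
    if np.abs(w).sum() > kappa:  # binary search for the soft threshold
        lo, hi = 0.0, a.max()
        for _ in range(50):
            mid = 0.5 * (lo + hi)
            if np.abs(w_of(mid)).sum() > kappa:
                lo = mid
            else:
                hi = mid
        w = w_of(hi)
    return w


def sparse_jump_model(Y, K=2, lam=10.0, kappa=2.0, n_iter=20, seed=0):
    # Coordinate descent: alternate between (1) state-sequence estimation,
    # (2) parameter (state-mean) estimation and (3) feature-weight selection.
    rng = np.random.default_rng(seed)
    T, p = Y.shape
    w = np.full(p, 1.0 / np.sqrt(p))                  # equal initial weights
    theta = Y[rng.choice(T, size=K, replace=False)].copy()
    for _ in range(n_iter):
        s = viterbi_states(Y, theta, w, lam)
        for k in range(K):
            if np.any(s == k):                        # state-conditional means
                theta[k] = Y[s == k].mean(axis=0)
        w = update_weights(Y, s, kappa)
    return s, theta, w


if __name__ == "__main__":
    # Toy example: two states that differ only in the first 3 of 20 features.
    rng = np.random.default_rng(1)
    T, p = 400, 20
    s_true = (np.arange(T) // 100) % 2                # blocks of 100 observations
    Y = rng.normal(size=(T, p))
    Y[:, :3] += 2.0 * s_true[:, None]
    s_hat, theta, w = sparse_jump_model(Y, K=2, lam=20.0, kappa=2.0)
    print("selected features:", np.flatnonzero(w > 1e-8))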

Similar articles

An Overview of the New Feature Selection Methods in Finite Mixture of Regression Models

Variable (feature) selection has attracted much attention in contemporary statistical learning and recent scientific research. This is mainly due to the rapid advancement in modern technology that allows scientists to collect data of unprecedented size and complexity. One type of statistical problem in such applications is concerned with modeling an output variable as a function of a sma...

Robust feature selection using probabilistic union models

This paper provides a summary of our recent work on robust speech recognition based on a new statistical approach, the probabilistic union model. In particular, we considered speech recognition involving partial corruption in frequency bands, in time duration, and further in feature components. In all these situations, we assumed no prior knowledge about the corrupting noise, e.g. its band locat...

Feature Selection and Hypothesis Selection Models of Induction

In POSTHOC, we have focused on how prior knowledge influences learning rates and we have so far ignored other information used by human learners (e.g., perceptual salience of features; Bower & Trabasso, 1968). As a consequence, POSTHOC is not intended ...

Fast SFFS-Based Algorithm for Feature Selection in Biomedical Datasets

Biomedical datasets usually include a large number of features relative to the number of samples. However, some data dimensions may be less relevant or even irrelevant to the output class. Selection of an optimal subset of features is critical, not only to reduce the processing cost but also to improve the classification results. To this end, this paper presents a hybrid method of filter and wr...
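
As a rough illustration of the wrapper component of such an approach, the sketch below implements plain sequential floating forward selection (SFFS) with a cross-validated classifier score as the selection criterion. It assumes scikit-learn is available; the LogisticRegression estimator, the scoring choice, and the function name sffs are illustrative assumptions and do not reproduce this paper's specific filter/wrapper hybrid.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score


def sffs(X, y, d, estimator=None, cv=5):
    # Sequential floating forward selection: greedily add the feature that
    # most improves a cross-validated score, then conditionally remove
    # features while removal beats the best score recorded for that size.
    if estimator is None:
        estimator = LogisticRegression(max_iter=1000)
    p = X.shape[1]

    def score(S):
        return cross_val_score(estimator, X[:, sorted(S)], y, cv=cv).mean()

    selected = set()
    best_by_size = {}                      # best score seen at each subset size
    while len(selected) < d:
        # forward step: add the single best remaining feature
        s_add, j_add = max((score(selected | {j}), j)
                           for j in range(p) if j not in selected)
        selected.add(j_add)
        k = len(selected)
        best_by_size[k] = max(best_by_size.get(k, -np.inf), s_add)
        # floating (conditional backward) steps
        while len(selected) > 2:
            s_drop, j_drop = max((score(selected - {j}), j) for j in selected)
            if s_drop <= best_by_size.get(len(selected) - 1, -np.inf):
                break                      # removal does not improve: stop
            selected.remove(j_drop)
            best_by_size[len(selected)] = s_drop
    return sorted(selected)


if __name__ == "__main__":
    from sklearn.datasets import make_classification
    X, y = make_classification(n_samples=200, n_features=30, n_informative=5,
                               random_state=0)
    print("selected:", sffs(X, y, d=5))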

Proposed Feature Selection for Dynamic Thermal Management in Multicore Systems

Increasing the number of cores to meet the demand for more computing power has led to higher processor temperatures in multi-core systems. One of the main approaches to reducing temperature is dynamic thermal management. These techniques are divided into two classes, reactive and proactive. Proactive methods manage the processor temperature by forecasting the temperature be...

Journal

Journal: Expert Systems With Applications

Year: 2021

ISSN: 1873-6793, 0957-4174

DOI: https://doi.org/10.1016/j.eswa.2021.115558